Golang Job: Binancian Accelerator Programme- Java Data Backend

Job added on

Company

Binance

Location

New York - Canada

Job type

Full-Time

Golang Job Details

Binance is the global blockchain company behind the world’s largest digital asset exchange by trading volume and users, serving a greater mission to accelerate cryptocurrency adoption and increase the freedom of money.

Are you looking to be a part of the most influential company in the blockchain industry and contribute to the cryptocurrency revolution that is changing the world?

We are looking for several talented Data Engineer interns to join our Risk team. You will have the chance to be involved in developing and enhancing the Data Engineering lifecycle on cloud-based, large-scale distributed architectures, working in a cross-national team with various stakeholders including, but not limited to, software developers (Backend, Data Warehouse Engineers, Data Analysts, Data Scientists) and fellow Sr. Data Backend Developers.

Responsibilities

    • Work as part of the Data Services team to maintain and improve the Data Services/Infrastructure/Pipeline/Portal, extending their capabilities, improving robustness and quality, and optimizing performance
    • Develop solutions to drive optimization of data-related operations around cloud services management, time to market, security, privacy, and delivery of worldwide capability
    • Troubleshoot and resolve defects

Requirements

    • Must be currently enrolled in an undergraduate, master's, or Ph.D. program, ideally in Computer Science or an Engineering-related field
    • Experienced in at least one programming language: Java (highly preferred), Scala, Python, Golang, or another OOP language
    • Good computer science fundamentals: a systematic understanding of operating systems, API design, databases, data structures, architecture design, etc.
    • Solid SQL skills, such as understanding the principles of SQL execution under different frameworks, and familiarity with structured and unstructured big data analysis tools
    • Fundamental experience with the big data ecosystem technology stack (HDFS, Hive, Elasticsearch, HBase, Impala, Spark/Flink, Kafka, Airflow, Sqoop, etc.)
    • Experience in extracurricular activities such as hackathons and open-source projects is a big plus